Hierarchical Orthogonal Factorization: Sparse Least Squares Problems

Authors

Abstract

In this work, we develop a fast hierarchical solver for large, sparse least squares problems. We build upon spaQR (sparsified QR; Gnanasekaran and Darve, SIAM J Matrix Anal Appl 43(1):94–123, 2022), an algorithm developed by the authors to solve large sparse linear systems. Our algorithm is built on top of a Nested Dissection based multifrontal approach. We use low-rank approximations of the frontal matrices to sparsify the vertex separators at every level of the elimination tree. Using a two-step sparsification scheme, we reduce the number of columns and maintain the ratio of rows to columns in each front without introducing any additional fill-in. With this improved scheme, we show that the runtime scales as $\mathcal{O}(M \log N)$ and that the factorization is stored in $\mathcal{O}(M)$ memory. This is achieved at the expense of a small and controllable approximation error. The end result is an approximate factorization of the matrix stored as a sequence of orthogonal and upper-triangular factors, which is therefore easy to apply or solve against a vector. Numerical experiments compare the performance with a direct QR solver, an inner-outer iterative method, and the CGLS method with a diagonal preconditioner and a Robust Incomplete Factorization (RIF) preconditioner.
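Because the factorization is stored as orthogonal and upper-triangular factors, applying it to a right-hand side reduces to one orthogonal apply followed by a triangular solve. The sketch below is a minimal dense illustration of that principle in NumPy/SciPy; it is not the authors' spaQR code, and it uses SciPy's LSQR in place of preconditioned CGLS as the iterative reference.

```python
import numpy as np
from scipy.linalg import qr, solve_triangular
from scipy.sparse.linalg import lsqr

# Small dense stand-in for the large sparse problems targeted by the paper.
rng = np.random.default_rng(0)
M, N = 200, 50
A = rng.standard_normal((M, N))
b = rng.standard_normal(M)

# Economy QR: A = Q R with Q (M x N, orthonormal columns) and R (N x N, upper triangular).
# In spaQR this factorization is approximate and stored hierarchically; the solve step is analogous.
Q, R = qr(A, mode='economic')

# Least squares solution of min ||Ax - b||_2: apply Q^T, then one triangular solve.
x_qr = solve_triangular(R, Q.T @ b)

# Iterative reference (LSQR, mathematically equivalent to CGLS on the normal equations).
x_it = lsqr(A, b, atol=1e-12, btol=1e-12)[0]

print("max difference between direct and iterative solutions:",
      np.max(np.abs(x_qr - x_it)))
```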


Related Articles

A Fast Hierarchical Alternating Least Squares Algorithm for Orthogonal Nonnegative Matrix Factorization

Nonnegative Matrix Factorization (NMF) is a popular technique in a variety of fields due to its component-based representation with physical interpretability. NMF finds nonnegative hidden structures as oblique bases and coefficients. Recently, Orthogonal NMF (ONMF), which imposes an orthogonality constraint on NMF, has been gathering a great deal of attention. ONMF is more appropriate for the cluste...
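For reference, the sketch below shows hierarchical alternating least squares (HALS) updates for plain NMF; it omits the orthogonality constraint that distinguishes the ONMF algorithm summarized above and is only an illustrative toy implementation.

```python
import numpy as np

def nmf_hals(X, r, n_iter=200, eps=1e-10, seed=0):
    """Plain NMF (X ~ W H) via hierarchical alternating least squares (HALS).

    Each column of W (and row of H) is updated in turn by a closed-form
    nonnegative least squares step. No orthogonality constraint is enforced.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        A, B = X @ H.T, H @ H.T          # update columns of W one at a time
        for j in range(r):
            W[:, j] = np.maximum(eps, W[:, j] + (A[:, j] - W @ B[:, j]) / B[j, j])
        C, D = W.T @ X, W.T @ W          # update rows of H one at a time
        for j in range(r):
            H[j, :] = np.maximum(eps, H[j, :] + (C[j, :] - D[j, :] @ H) / D[j, j])
    return W, H

# Toy usage on a random nonnegative matrix.
X = np.random.default_rng(1).random((40, 30))
W, H = nmf_hals(X, r=5)
print("relative reconstruction error:", np.linalg.norm(X - W @ H) / np.linalg.norm(X))
```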


DMS: Distributed Sparse Tensor Factorization with Alternating Least Squares

Tensors are data structures indexed along three or more dimensions. Tensors have found increasing use in domains such as data mining and recommender systems, where the dimensions can have enormous length and the data are consequently very sparse. The canonical polyadic decomposition (CPD) is a popular tensor factorization for discovering latent features and is most commonly found via the method of alternating...
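The sketch below is a dense, single-node toy implementation of CPD via alternating least squares; the DMS system described above is distributed and operates on sparse tensors, so this only illustrates the underlying update rule.

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Kronecker product; rows indexed by (j, k) -> j*K + k."""
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, B.shape[1])

def cp_als(X, rank, n_iter=100, seed=0):
    """CP decomposition of a 3-way tensor X ~ sum_r a_r o b_r o c_r via ALS."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X1 = X.reshape(I, J * K)                      # mode-1 unfolding, columns (j, k)
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)   # mode-2 unfolding, columns (i, k)
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)   # mode-3 unfolding, columns (i, j)
    for _ in range(n_iter):
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Toy usage: recover an exactly rank-3 tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = rng.standard_normal((6, 3)), rng.standard_normal((7, 3)), rng.standard_normal((8, 3))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, rank=3)
print("fit error:", np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(X))
```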


Sparse recovery via Orthogonal Least-Squares under presence of Noise

We consider the Orthogonal Least-Squares (OLS) algorithm for the recovery of an m-dimensional k-sparse signal from a small number of noisy linear measurements. The Exact Recovery Condition (ERC) in the bounded-noise scenario is established for OLS under certain conditions on the nonzero elements of the signal. The new result also improves the existing guarantees for the Orthogonal Matching Pursuit (OMP) algori...
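The sketch below is a naive implementation of the OLS selection rule (solve a least squares problem for each candidate column and keep the one that minimizes the residual). It illustrates the greedy principle only; the efficient updates and the recovery guarantees are the subject of the paper above.

```python
import numpy as np

def ols_recover(A, y, k):
    """Greedy Orthogonal Least-Squares: at each step, add the column whose
    inclusion in the support minimizes the least squares residual."""
    m, n = A.shape
    support = []
    for _ in range(k):
        best_j, best_norm, best_x = None, np.inf, None
        for j in range(n):
            if j in support:
                continue
            cols = support + [j]
            x, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
            r = np.linalg.norm(y - A[:, cols] @ x)
            if r < best_norm:
                best_j, best_norm, best_x = j, r, x
        support.append(best_j)
    x_full = np.zeros(n)
    x_full[support] = best_x   # support order matches the last solved column order
    return x_full, sorted(support)

# Toy usage: recover a 3-sparse signal from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 17, 42]] = [1.0, -2.0, 1.5]
y = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat, supp = ols_recover(A, y, k=3)
print("recovered support:", supp)
```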


A Coarse-Grained Parallel QR-Factorization Algorithm for Sparse Least Squares Problems

A sparse QR-factorization algorithm, SPARQR, for coarse-grained parallel computations is described. The coefficient matrix, which is assumed to be general sparse, is reordered in an attempt to bring as many zero elements as possible into the lower left corner. The reordered matrix is then partitioned into block rows, and Givens plane rotations are applied in each block row. These are independent task...
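Within each block row, zeros below the diagonal are introduced with Givens plane rotations. The dense sketch below shows this elementary operation on a single small block; it is only an illustration of the rotation step, not the parallel sparse algorithm described above.

```python
import numpy as np

def givens_qr(A):
    """QR factorization of a dense block via Givens plane rotations."""
    R = A.astype(float).copy()
    m, n = R.shape
    Q = np.eye(m)
    for j in range(n):                        # eliminate below the diagonal, column by column
        for i in range(m - 1, j, -1):
            a, b = R[i - 1, j], R[i, j]
            if b == 0.0:
                continue
            r = np.hypot(a, b)
            c, s = a / r, b / r
            G = np.array([[c, s], [-s, c]])   # 2x2 rotation acting on rows i-1 and i
            R[[i - 1, i], j:] = G @ R[[i - 1, i], j:]
            Q[:, [i - 1, i]] = Q[:, [i - 1, i]] @ G.T
    return Q, R

# Toy usage on one small block row.
A = np.random.default_rng(0).standard_normal((6, 4))
Q, R = givens_qr(A)
print(np.allclose(Q @ R, A), np.allclose(R, np.triu(R)))
```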


Incremental Cholesky Factorization for Least Squares Problems in Robotics

Online applications in robotics, computer vision, and computer graphics rely on efficiently solving the associated nonlinear systems at every step. Iteratively solving the nonlinear system at every step becomes very expensive as the size of the problem grows. This can be mitigated by incrementally updating the linear system and changing the linearization point only when needed. This paper proposes an i...
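A standard building block for this kind of incremental updating is the rank-one Cholesky update: when a new measurement row a is appended to A, the factor of the normal matrix A^T A can be updated in O(n^2) operations instead of being refactored from scratch. The sketch below implements the textbook update as an illustration of the idea; it is not the paper's incremental algorithm.

```python
import numpy as np

def chol_update(L, v):
    """Given lower-triangular L with L L^T = M, return the Cholesky factor of M + v v^T."""
    L, v = L.copy(), v.astype(float).copy()
    n = v.size
    for k in range(n):
        r = np.hypot(L[k, k], v[k])
        c, s = r / L[k, k], v[k] / L[k, k]
        L[k, k] = r
        if k + 1 < n:
            L[k + 1:, k] = (L[k + 1:, k] + s * v[k + 1:]) / c
            v[k + 1:] = c * v[k + 1:] - s * L[k + 1:, k]
    return L

# Toy usage: append one measurement row `a` to a least squares problem A x ~ b.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 8))
a = rng.standard_normal(8)
L = np.linalg.cholesky(A.T @ A)
L_new = chol_update(L, a)                               # incremental update
L_ref = np.linalg.cholesky(A.T @ A + np.outer(a, a))    # full refactorization
print("max difference:", np.max(np.abs(L_new - L_ref)))
```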



Journal

Journal title: Journal of Scientific Computing

Year: 2022

ISSN: 1573-7691, 0885-7474

DOI: https://doi.org/10.1007/s10915-022-01824-9